AIbase

# Ultra-long context

## ERNIE 4.5 300B A47B PT GGUF
License: Apache-2.0
ERNIE-4.5-300B-A47B is a multimodal pre-trained model built on a mixture-of-experts (MoE) architecture. It offers strong text understanding and generation and supports joint vision-language reasoning.
Tags: Large Language Model · Transformers · Supports Multiple Languages
gabriellarson · 186 · 1
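The entry above describes an MoE model. As a rough illustration of how MoE routing works in general (a minimal NumPy sketch with toy linear "experts", not ERNIE's actual implementation), each token is sent only to the top-k experts chosen by a gating network:

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Route token vector x to the top-k experts by gate score and
    combine their outputs, weighted by a softmax over those scores."""
    scores = gate_w @ x                       # one gate score per expert
    topk = np.argsort(scores)[-k:]            # indices of the k best experts
    w = np.exp(scores[topk] - scores[topk].max())
    w /= w.sum()                              # softmax over selected experts only
    return sum(wi * experts[i](x) for wi, i in zip(w, topk))

rng = np.random.default_rng(0)
d, n_experts = 4, 8
gate_w = rng.normal(size=(n_experts, d))
# Each "expert" is just a fixed random linear map in this sketch.
mats = [rng.normal(size=(d, d)) for _ in range(n_experts)]
experts = [lambda x, m=m: m @ x for m in mats]
y = moe_forward(rng.normal(size=d), gate_w, experts, k=2)
```

This sparsity is why a 300B-parameter MoE model can activate only ~47B parameters per token (the "A47B" in the name): most experts are skipped for any given input.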
## Qwen3 235B A22B INT4MIX
License: Apache-2.0
Qwen3-235B-A22B is the latest generation of the Tongyi (Qwen) large model series, offering both dense and mixture-of-experts (MoE) models. It delivers breakthroughs in reasoning, instruction following, agent capabilities, and multilingual support.
Tags: Large Language Model · Transformers
fastllm · 144 · 2
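The "INT4MIX" suffix above refers to 4-bit weight quantization. The basic idea, sketched below with symmetric per-group quantization in NumPy (an illustrative simplification, not fastllm's actual scheme), is to store weights as small integers plus one float scale per group:

```python
import numpy as np

def quantize_int4(w, group=8):
    """Symmetric 4-bit quantization: map each group of floats to
    integers in [-7, 7] with one float scale per group."""
    w = w.reshape(-1, group)
    scale = np.abs(w).max(axis=1, keepdims=True) / 7.0
    scale[scale == 0] = 1.0                     # avoid divide-by-zero on all-zero groups
    q = np.clip(np.round(w / scale), -7, 7).astype(np.int8)
    return q, scale

def dequantize_int4(q, scale):
    """Recover approximate float weights from ints and scales."""
    return (q.astype(np.float32) * scale).reshape(-1)

rng = np.random.default_rng(1)
w = rng.normal(size=64).astype(np.float32)
q, s = quantize_int4(w)
w_hat = dequantize_int4(q, s)
err = np.abs(w - w_hat).max()
```

Packing two such 4-bit values per byte cuts weight storage roughly 4x versus FP16, at the cost of the small reconstruction error `err`; "mixed" schemes keep sensitive layers at higher precision.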
## Qwen2 VL 72B Instruct GGUF
License: Other
Qwen2-VL-72B-Instruct-GGUF is a quantized version of the original Qwen2-VL-72B-Instruct model. It supports multimodal tasks and can be run through GaiaNet.
Tags: Image-to-Text · Transformers · English
gaianet · 1,803 · 0
© 2025 AIbase